Structured Sparsity: Discrete and Convex approaches
Authors
Abstract
Compressive sensing (CS) exploits sparsity to recover sparse or compressible signals from dimensionality-reducing, non-adaptive sensing mechanisms. Sparsity is also used to enhance interpretability in machine learning and statistics applications: while the ambient dimension is vast in modern data analysis problems, the relevant information therein typically resides in a much lower-dimensional space. However, many solutions proposed today do not leverage the true underlying structure. Recent results in CS extend the simple sparsity idea to more sophisticated structured sparsity models, which describe the interdependency between the nonzero components of a signal, increase the interpretability of the results, and lead to better recovery performance. To better understand the impact of structured sparsity, in this chapter we analyze the connections between the discrete models and their convex relaxations, highlighting their relative advantages. We start with the general group sparse model and then elaborate on two important special cases: the dispersive and the hierarchical models. For each, we present the models in their discrete nature, discuss how to solve the ensuing discrete problems, and then describe convex relaxations. We also consider more general structures as defined by set functions and present their convex proxies. Further, we discuss efficient optimization solutions for structured sparsity problems and illustrate structured sparsity in action via three applications.
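To make the discrete/convex contrast concrete, the following is a minimal sketch, not taken from the chapter (the function names and the toy vector are ours): the discrete model is enforced by Euclidean projection onto the k-sparse set (hard thresholding), while its convex relaxation is handled through the proximal operator of the ℓ1 norm (soft thresholding); the group-sparse analogue keeps or discards whole groups at once.

```python
# A minimal sketch under standard definitions; names are ours, not the chapter's.
import numpy as np

def project_k_sparse(x, k):
    """Discrete model: Euclidean projection onto {x : ||x||_0 <= k}.
    Keeps the k largest-magnitude entries (hard thresholding)."""
    out = np.zeros_like(x)
    idx = np.argsort(np.abs(x))[-k:]
    out[idx] = x[idx]
    return out

def soft_threshold(x, lam):
    """Convex relaxation: proximal operator of lam * ||x||_1."""
    return np.sign(x) * np.maximum(np.abs(x) - lam, 0.0)

def project_group_sparse(x, groups, k):
    """Group-sparse discrete model: keep the k groups with the largest
    Euclidean norm; `groups` is a list of index arrays partitioning x."""
    energies = [np.linalg.norm(x[g]) for g in groups]
    keep = np.argsort(energies)[-k:]
    out = np.zeros_like(x)
    for j in keep:
        out[groups[j]] = x[groups[j]]
    return out

x = np.array([0.1, -2.0, 0.3, 1.5, -0.2, 0.05])
print(project_k_sparse(x, 2))   # exactly 2 nonzeros survive
print(soft_threshold(x, 0.25))  # every entry is shrunk toward zero
```

The projection returns exactly k nonzeros, while the convex proxy merely shrinks all entries and promotes sparsity indirectly; this is the basic trade-off between the discrete and convex viewpoints the chapter examines.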
Similar Resources
Factorized Latent Spaces with Structured Sparsity
Recent approaches to multi-view learning have shown that factorizing the information into parts that are shared across all views and parts that are private to each view could effectively account for the dependencies and independencies between the different input modalities. Unfortunately, these approaches involve minimizing non-convex objective functions. In this paper, we propose an approach t...
Solving Structured Sparsity Regularization with Proximal Methods
Proximal methods have recently been shown to provide effective optimization procedures to solve the variational problems defining the ℓ1 regularization algorithms. The goal of the paper is twofold. First we discuss how proximal methods can be applied to solve a large class of machine learning algorithms which can be seen as extensions of ℓ1 regularization, namely structured sparsity regularizat...
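As an illustration of the proximal machinery this abstract refers to, here is a hedged sketch of the basic proximal-gradient (ISTA) iteration for ℓ1-regularized least squares; the problem instance and parameter choices are ours, and structured-sparsity variants fit the same template with the soft-thresholding step replaced by the proximal operator of the structured penalty.

```python
# A hedged ISTA sketch for  min_x 0.5*||Ax - b||^2 + lam*||x||_1 ;
# the data and parameters below are illustrative, not from the paper.
import numpy as np

def ista(A, b, lam, n_iter=500):
    x = np.zeros(A.shape[1])
    step = 1.0 / np.linalg.norm(A, 2) ** 2  # 1 / Lipschitz constant of the gradient
    for _ in range(n_iter):
        grad = A.T @ (A @ x - b)            # gradient of the smooth least-squares term
        x = x - step * grad                 # forward (gradient) step
        x = np.sign(x) * np.maximum(np.abs(x) - step * lam, 0.0)  # proximal step
    return x

rng = np.random.default_rng(0)
A = rng.standard_normal((40, 100))
x_true = np.zeros(100)
x_true[[3, 17, 60]] = [1.0, -2.0, 1.5]
b = A @ x_true + 0.01 * rng.standard_normal(40)
x_hat = ista(A, b, lam=0.1)
print(np.argsort(np.abs(x_hat))[-3:])  # largest entries should sit at indices 3, 17, 60
```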
Online Dictionary Learning with Group Structure Inducing Norms
• Sparse coding.
• Structured sparsity (e.g., disjunct groups, trees): increased performance in several applications.
• Our goal: develop a dictionary learning method, which
  – enables general overlapping group structures,
  – is online: fast, memory efficient, adaptive,
  – applies non-convex sparsity inducing regularization:
    ∗ fewer measurements,
    ∗ weaker conditions on the dictionary,
    ∗ robust (w....
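The basic building block behind such group-structure-inducing norms is block soft-thresholding, the proximal operator of the group norm λ Σ_g ||x_g||_2 for non-overlapping groups. The sketch below is ours, not the paper's method, and covers only the non-overlapping case; the general overlapping structures and the online, non-convex aspects targeted above require considerably more machinery.

```python
# A hedged sketch (ours): block soft-thresholding, the proximal operator
# of  lam * sum_g ||x_g||_2  for non-overlapping groups.
import numpy as np

def block_soft_threshold(x, groups, lam):
    """Shrink each group toward zero; groups whose norm falls below lam vanish."""
    out = np.zeros_like(x)
    for g in groups:
        norm = np.linalg.norm(x[g])
        if norm > lam:
            out[g] = (1.0 - lam / norm) * x[g]  # uniform shrinkage of the whole group
    return out

x = np.array([0.2, 0.1, 3.0, -2.0, 0.05, 0.02])
groups = [np.array([0, 1]), np.array([2, 3]), np.array([4, 5])]
print(block_soft_threshold(x, groups, lam=0.5))  # only the strong middle group survives
```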
Bayesian Sparsity for Intractable Distributions
Bayesian approaches for single-variable and group-structured sparsity outperform L1 regularization, but are challenging to apply to large, potentially intractable models. Here we show how noncentered parameterizations, a common trick for improving the efficiency of exact inference in hierarchical models, can similarly improve the accuracy of variational approximations. We develop this with two ...
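For readers unfamiliar with the trick, here is a minimal illustration, in our own notation rather than the paper's, of a noncentered parameterization: instead of drawing a weight directly with a random scale (the centered form, where weight and scale are strongly coupled), one draws a standard normal variable independent of the scale and rescales it deterministically.

```python
# A minimal illustration (our notation) of the noncentered parameterization.
import numpy as np

rng = np.random.default_rng(1)
n = 10_000
tau = np.exp(rng.standard_normal(n))  # a random (log-normal) scale, for illustration

# Noncentered form: sample w_raw ~ Normal(0, 1) independently of tau, then
# rescale. Marginally w ~ Normal(0, tau^2), exactly as in the centered form,
# but the inference variables (tau, w_raw) are a priori independent,
# whereas (tau, w) are not -- which is what helps variational approximations.
w_raw = rng.standard_normal(n)
w = tau * w_raw

print(np.corrcoef(tau, np.abs(w))[0, 1])      # |w| correlates with the scale tau ...
print(np.corrcoef(tau, np.abs(w_raw))[0, 1])  # ... while w_raw does not (approx. 0)
```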
A Fast Algorithm for Separated Sparsity via Perturbed Lagrangians
Sparsity-based methods are widely used in machine learning, statistics, and signal processing. There is now a rich class of structured sparsity approaches that expand the modeling power of the sparsity paradigm and incorporate constraints such as group sparsity, graph sparsity, or hierarchical sparsity. While these sparsity models offer improved sample complexity and better interpr...
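To make the separated-sparsity model concrete, here is a hedged sketch, in our own notation, of the underlying discrete projection: keep at most k entries whose indices are at least delta apart, maximizing the retained energy. A simple O(nk) dynamic program suffices for illustration; the paper's contribution is a substantially faster Lagrangian-based algorithm, not this one.

```python
# A hedged sketch (ours, not the paper's algorithm): projection onto the
# separated-sparsity model via dynamic programming.
import numpy as np

def project_separated(x, k, delta):
    """Keep at most k entries of x, indices >= delta apart, maximizing energy."""
    n = len(x)
    w = x ** 2
    # dp[j, i]: best retained energy using the first i coordinates with j kept.
    dp = np.full((k + 1, n + 1), -np.inf)
    dp[0, :] = 0.0
    take_tbl = np.zeros((k + 1, n + 1), dtype=bool)
    for j in range(1, k + 1):
        for i in range(1, n + 1):
            skip = dp[j, i - 1]                          # do not keep entry i-1
            take = dp[j - 1, max(i - delta, 0)] + w[i - 1]  # keep it; previous kept
            if take >= skip:                                # index must be <= i-1-delta
                dp[j, i] = take
                take_tbl[j, i] = True
            else:
                dp[j, i] = skip
    j = int(np.argmax(dp[:, n]))  # "at most k": pick the best feasible count
    out = np.zeros_like(x)
    i = n
    while j > 0:                  # backtrack to recover the kept indices
        if take_tbl[j, i]:
            out[i - 1] = x[i - 1]
            i = max(i - delta, 0)
            j -= 1
        else:
            i -= 1
    return out

x = np.array([3.0, 2.9, 0.1, 0.2, 2.5, 0.1])
print(project_separated(x, k=2, delta=2))  # keeps indices 0 and 4, not the adjacent 0 and 1
```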
Journal title: CoRR
Volume: abs/1507.05367
Issue: -
Pages: -
Publication date: 2015